-
The exponential growth of digital content has generated massive textual datasets, necessitating advanced analytical approaches. Large Language Models (LLMs) have emerged as tools capable of processing and extracting insights from massive unstructured textual datasets. However, how to leverage LLMs for text analytics in Information Systems (IS) research is currently unclear. To assist the IS community in understanding how to operationalize LLMs, we propose a Text Analytics for Information Systems Research (TAISR) framework. Our proposed framework provides detailed recommendations, grounded in the IS and LLM literature, on how to conduct meaningful text-analytics IS research in the design science, behavioral, and econometric streams. We conducted three business intelligence case studies using our TAISR framework to demonstrate its application in several IS research contexts. We also outline the potential challenges and limitations of adopting LLMs for IS. By offering a systematic approach and evidence of its utility, our TAISR framework contributes to future IS research streams looking to incorporate powerful LLMs for text analytics.
Free, publicly accessible full text available March 31, 2026.
-
Abstract: We prove the closure ordering conjecture on the local $L$-parameters of representations in local Arthur packets of $\mathrm{G}_n = \mathrm{Sp}_{2n}, \mathrm{SO}_{2n+1}$ over a non-Archimedean local field of characteristic zero. Precisely, given any representation $\pi$ in a local Arthur packet $\Pi_{\psi}$, the closure of the local $L$-parameter of $\pi$ in the Vogan variety must contain the local $L$-parameter corresponding to $\psi$. This conjecture reveals a geometric nature of local Arthur packets and is inspired by the work of Adams, Barbasch and Vogan, and the work of Cunningham, Fiori, Moussaoui, Mracek and Xu, on ABV-packets. As an application, for general quasi-split connected reductive groups, we show that the closure ordering conjecture implies the enhanced Shahidi conjecture, under certain reasonable assumptions. This provides a framework towards the enhanced Shahidi conjecture in general. We verify these assumptions for $\mathrm{G}_n$, hence give a new proof of the enhanced Shahidi conjecture. Finally, we show that local Arthur packets cannot be fully contained in other ones, which is in contrast to the situation over Archimedean local fields and is of independent interest.
Free, publicly accessible full text available March 19, 2026.
-
Abstract: Recently, motivated by the theory of real local Arthur packets and making use of the wavefront sets of representations over non-Archimedean local fields $F$, Ciubotaru, Mason-Brown, and Okada defined the weak local Arthur packets, consisting of certain unipotent representations, and conjectured that they are unions of local Arthur packets. In this paper, we prove this conjecture for $\mathrm{Sp}_{2n}(F)$ and split $\mathrm{SO}_{2n+1}(F)$ under the assumption that the residue field characteristic of $F$ is large. In particular, this implies the unitarity of these unipotent representations. We also discuss the generalization of the weak local Arthur packets beyond unipotent representations, which reveals a close connection with a conjecture of Jiang on the structure of wavefront sets for representations in local Arthur packets.
-
Ravikumar, Pradeep (Ed.)
Data augmentation (DA) is a powerful workhorse for bolstering performance in modern machine learning. Specific augmentations, like translations and scaling in computer vision, are traditionally believed to improve generalization by generating new (artificial) data from the same distribution. However, this traditional viewpoint does not explain the success of prevalent augmentations in modern machine learning (e.g., randomized masking, cutout, mixup) that greatly alter the training data distribution. In this work, we develop a new theoretical framework to characterize the impact of a general class of DA on underparameterized and overparameterized linear model generalization. Our framework reveals that DA induces implicit spectral regularization through a combination of two distinct effects: (a) manipulating the relative proportion of eigenvalues of the data covariance matrix in a training-data-dependent manner, and (b) uniformly boosting the entire spectrum of the data covariance matrix through ridge regression. These effects, when applied to popular augmentations, give rise to a wide variety of phenomena, including discrepancies in generalization between overparameterized and underparameterized regimes and differences between regression and classification tasks. Our framework highlights the nuanced and sometimes surprising impacts of DA on generalization, and serves as a testbed for novel augmentation design.
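The ridge-like spectral boosting described in this abstract can be illustrated by a classical special case (not the paper's general framework): fitting ordinary least squares on inputs augmented with additive Gaussian noise is, in expectation, equivalent to ridge regression with penalty n·sigma². A minimal numerical sketch, with all variable names and problem sizes chosen here for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, sigma = 200, 5, 0.5

# Synthetic linear-regression data.
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

# Closed-form ridge solution with lambda = n * sigma^2, the regularizer
# induced in expectation by Gaussian-noise augmentation of the inputs.
lam = n * sigma**2
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Monte Carlo: ordinary least squares on many noise-augmented copies of
# the data. As the number of copies grows, this approaches w_ridge.
copies = 2000
X_aug = np.tile(X, (copies, 1)) + sigma * rng.normal(size=(copies * n, d))
y_aug = np.tile(y, copies)
w_aug, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)

print(np.max(np.abs(w_ridge - w_aug)))  # close to zero
```

This is the simplest instance of the "uniform spectrum boosting" effect: the noise adds sigma²·I to the (expected) data covariance, which is exactly a ridge penalty.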